Tensor Decomposition via Simultaneous Power Iteration (Supplementary Material)

Authors

  • Po-An Wang
  • Chi-Jen Lu
Abstract

Proof. One can relate the singular values of the Hadamard product Z ⊙ Z to those of the Kronecker product Z ⊗ Z. In particular, as Z ⊙ Z can be obtained from Z ⊗ Z by deleting some rows and columns, Lemma A.1 tells us that σ_min(Z ⊙ Z) ≥ σ_min(Z ⊗ Z) and σ_max(Z ⊙ Z) ≤ σ_max(Z ⊗ Z). The lemma then follows, as the Kronecker product Z ⊗ Z is known to satisfy σ_min(Z ⊗ Z) = (σ_min(Z))² and σ_max(Z ⊗ Z) = (σ_max(Z))².
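As a quick numerical sanity check of the Kronecker-product fact used in the last step (an illustration, not part of the paper), the NumPy snippet below verifies that the singular values of Z ⊗ Z are exactly the pairwise products of the singular values of Z, so that σ_min(Z ⊗ Z) = (σ_min(Z))² and σ_max(Z ⊗ Z) = (σ_max(Z))².

    import numpy as np

    # Illustration only: the singular values of the Kronecker product Z ⊗ Z
    # are all pairwise products of the singular values of Z.
    rng = np.random.default_rng(0)
    Z = rng.standard_normal((5, 4))                    # arbitrary test matrix

    s = np.linalg.svd(Z, compute_uv=False)             # singular values of Z
    s_kron = np.linalg.svd(np.kron(Z, Z), compute_uv=False)

    expected = np.sort(np.outer(s, s).ravel())[::-1]   # all products s_i * s_j
    assert np.allclose(s_kron, expected)
    assert np.isclose(s_kron.min(), s.min() ** 2)      # sigma_min(Z ⊗ Z) = sigma_min(Z)^2
    assert np.isclose(s_kron.max(), s.max() ** 2)      # sigma_max(Z ⊗ Z) = sigma_max(Z)^2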


Similar articles

Tensor Decomposition via Simultaneous Power Iteration

Tensor decomposition is an important problem with many applications across several disciplines, and a popular approach for this problem is the tensor power method. However, previous works with theoretical guarantees based on this approach can only find the top eigenvectors one after another, unlike the case for matrices. In this paper, we show how to find the eigenvectors simultaneously with the hel...
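To make the idea concrete, the sketch below shows one generic way to run a "simultaneous" tensor power iteration: a quadratic power update is applied to every column of a candidate matrix, and the columns are then re-orthonormalized with a QR step. This is an illustrative scheme under assumed conventions (a symmetric third-order tensor stored as a NumPy array), not necessarily the authors' exact algorithm.

    import numpy as np

    def simultaneous_tensor_power(T, k, n_iter=100, seed=0):
        """Generic sketch: update k candidate vectors at once for a symmetric
        third-order tensor T of shape (d, d, d), re-orthonormalizing with QR."""
        d = T.shape[0]
        rng = np.random.default_rng(seed)
        V, _ = np.linalg.qr(rng.standard_normal((d, k)))   # random orthonormal start
        for _ in range(n_iter):
            # Tensor power update on every column: v_j <- T(I, v_j, v_j)
            W = np.stack([np.einsum('abc,b,c->a', T, V[:, j], V[:, j])
                          for j in range(k)], axis=1)
            V, _ = np.linalg.qr(W)                         # simultaneous re-orthonormalization
        # Eigenvalue estimates T(v_j, v_j, v_j); signs can flip due to QR conventions
        lam = np.array([np.einsum('abc,a,b,c->', T, V[:, j], V[:, j], V[:, j])
                        for j in range(k)])
        return lam, V

    # Toy usage on an orthogonally decomposable tensor sum_i w_i * a_i ⊗ a_i ⊗ a_i
    d, k = 8, 3
    A, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((d, k)))
    w = np.array([3.0, 2.0, 1.0])
    T = np.einsum('i,ai,bi,ci->abc', w, A, A, A)
    lam, V = simultaneous_tensor_power(T, k)
    print(np.sort(np.abs(lam))[::-1])                      # ideally close to [3, 2, 1]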

Guaranteed Non-Orthogonal Tensor Decomposition via Alternating Rank-1 Updates

In this paper, we provide local and global convergence guarantees for recovering CP (Candecomp/Parafac) tensor decomposition. The main step of the proposed algorithm is a simple alternating rank-1 update which is the alternating version of the tensor power iteration adapted for asymmetric tensors. Local convergence guarantees are established for third order tensors of rank k in d dimensions, wh...
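As an illustration of the update the abstract refers to, one alternating rank-1 sweep for an asymmetric third-order tensor could look like the following; the function name and the exact normalization are assumptions made for this sketch, not the cited paper's code.

    import numpy as np

    def alternating_rank1_update(T, a, b, c):
        """One sweep of alternating rank-1 updates for an asymmetric third-order
        tensor T: each factor is refreshed by contracting T against the other
        two factors and then renormalizing."""
        a = np.einsum('ijk,j,k->i', T, b, c); a /= np.linalg.norm(a)
        b = np.einsum('ijk,i,k->j', T, a, c); b /= np.linalg.norm(b)
        c = np.einsum('ijk,i,j->k', T, a, b); c /= np.linalg.norm(c)
        return a, b, c

Repeating such sweeps from several random initializations, combined with deflation, is the usual way this kind of rank-1 update is turned into a full CP decomposition.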

Fast and Guaranteed Tensor Decomposition via Sketching

Tensor CANDECOMP/PARAFAC (CP) decomposition has wide applications in statistical learning of latent variable models and in data mining. In this paper, we propose fast and randomized tensor CP decomposition algorithms based on sketching. We build on the idea of count sketches, but introduce many novel ideas which are unique to tensors. We develop novel methods for randomized computation of tenso...
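As background for the sketching idea mentioned above, here is a minimal count-sketch primitive in NumPy; it is only the basic building block, and the paper's tensor-specific sketches are considerably more involved than this sketch.

    import numpy as np

    def count_sketch(x, m, seed=0):
        """Hash each coordinate of x into one of m buckets with a random sign;
        colliding coordinates are summed."""
        rng = np.random.default_rng(seed)
        h = rng.integers(0, m, size=x.shape[0])         # bucket hash h(i)
        s = rng.choice([-1.0, 1.0], size=x.shape[0])    # sign hash s(i)
        sketch = np.zeros(m)
        np.add.at(sketch, h, s * x)                     # sketch[h(i)] += s(i) * x[i]
        return sketch, h, s

    def estimate_coordinate(sketch, h, s, i):
        """Unbiased estimate of x[i] recovered from the sketch."""
        return s[i] * sketch[h[i]]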

Tensor Decompositions for Learning Latent Variable Models

This work considers a computationally and statistically efficient parameter estimation method for a wide class of latent variable models, including Gaussian mixture models, hidden Markov models, and latent Dirichlet allocation, which exploits a certain tensor structure in their low-order observable moments (typically, of second- and third-order). Specifically, parameter estimation is reduced to the pro...
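To illustrate the tensor structure in low-order moments, the snippet below builds the population third moment of the simplest exchangeable (single-topic) model, E[x1 ⊗ x2 ⊗ x3] = Σ_i w_i μ_i ⊗ μ_i ⊗ μ_i; decomposing this symmetric tensor recovers the model parameters. The dimensions and values used here are arbitrary placeholders.

    import numpy as np

    # Population third moment of a simple exchangeable (single-topic) model:
    # M3 = sum_i w_i * mu_i ⊗ mu_i ⊗ mu_i, the tensor one would decompose.
    d, k = 10, 3
    rng = np.random.default_rng(0)
    w = np.array([0.5, 0.3, 0.2])                 # mixing weights (sum to 1)
    mu = rng.random((d, k))                       # component mean / topic vectors as columns
    M3 = np.einsum('i,ai,bi,ci->abc', w, mu, mu, mu)
    print(M3.shape)                               # (10, 10, 10)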

Online and Differentially-Private Tensor Decomposition

Tensor decomposition is an important tool for big data analysis. In this paper, we resolve many of the key algorithmic questions regarding robustness, memory efficiency, and differential privacy of tensor decomposition. We propose simple variants of the tensor power method which enjoy these strong properties. We present the first guarantees for online tensor power method which has a...


Publication year: 2017